Rayleigh Quotient Based Optimization Methods For Eigenvalue Problems

Author

  • Ren-Cang Li
Abstract

Four classes of eigenvalue problems that admit min-max principles and Cauchy interlacing inequalities similar to those the symmetric eigenvalue problem famously enjoys are investigated. These min-max principles pave the way for efficient numerical solution of extreme eigenpairs by optimizing the so-called Rayleigh quotient functions. In fact, scientists and engineers have already been doing just that to compute the eigenvalues and eigenvectors of Hermitian matrix pencils A − λB with B positive definite, the first of our four classes. But little attention has gone to the other three classes: positive semidefinite pencils, linear response eigenvalue problems, and hyperbolic eigenvalue problems, in part because most min-max principles for them were discovered only very recently, and more are still being discovered. These principles are expected to drive the design of better optimization-based numerical methods for years to come.
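
As a concrete illustration of optimizing the Rayleigh quotient of a Hermitian pencil (the first class above), here is a minimal steepest-descent sketch in Python with NumPy/SciPy. It is not from the paper: it minimizes ρ(x) = x*Ax / x*Bx by a Rayleigh–Ritz line search on span{x, r} and typically converges to the smallest eigenpair of A − λB when B is positive definite.

```python
import numpy as np
from scipy.linalg import eigh

def rayleigh_quotient(A, B, x):
    """rho(x) = x*Ax / x*Bx for a Hermitian pencil A - lambda*B, B > 0."""
    return (x.conj() @ A @ x).real / (x.conj() @ B @ x).real

def sd_smallest_eigpair(A, B, x0, maxit=500, tol=1e-10):
    """Steepest descent on the Rayleigh quotient with exact line search:
    each step minimizes rho over span{x, r}, where r = Ax - rho*Bx is the
    residual (the gradient of rho up to a positive scaling)."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(maxit):
        rho = rayleigh_quotient(A, B, x)
        r = A @ x - rho * (B @ x)
        if np.linalg.norm(r) <= tol * np.linalg.norm(A @ x):
            break
        # Rayleigh-Ritz on the 2-dimensional search space span{x, r}
        # (r is orthogonal to x, so S always has full column rank here):
        S = np.column_stack([x, r])
        w, V = eigh(S.conj().T @ A @ S, S.conj().T @ B @ S)
        x = S @ V[:, 0]              # Ritz vector of the smallest Ritz value
        x /= np.linalg.norm(x)
    return rayleigh_quotient(A, B, x), x

rng = np.random.default_rng(0)
n = 100
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
C = rng.standard_normal((n, n)); B = C @ C.T + n * np.eye(n)
lam, x = sd_smallest_eigpair(A, B, rng.standard_normal(n))
print(lam, eigh(A, B, eigvals_only=True)[0])   # should agree
```

Enlarging the two-dimensional search space (as in locally optimal conjugate-gradient schemes such as LOBPCG) accelerates convergence; the min-max principles investigated in the paper are what justify analogous optimization schemes for the other three problem classes.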


Related articles

Geometric Optimization Methods for Adaptive Filtering

The techniques and analysis presented in this thesis provide new methods to solve optimization problems posed on Riemannian manifolds. These methods are applied to the subspace tracking problem found in adaptive signal processing and adaptive control. A new point of view is offered for the constrained optimization problem. Some classical optimization techniques on Euclidean space are generalize...

Robust Rayleigh quotient minimization and nonlinear eigenvalue problems

In this paper, we study robust Rayleigh quotient optimization problems that arise when optimizing the worst-case Rayleigh quotient of data matrices subject to uncertainties. We propose to solve such problems by exploiting their characterization as a nonlinear eigenvalue problem with eigenvector nonlinearity. With this approach, we can show that a commonly used iterative method can be diverg...
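
For context, a typical iterative method for eigenvector-dependent nonlinear eigenvalue problems A(v)v = λv is the plain self-consistent-field (SCF) fixed-point iteration, which indeed may diverge. A minimal generic sketch follows; the toy nonlinearity A_of_v is illustrative only, not the paper's robust-optimization operator.

```python
import numpy as np
from scipy.linalg import eigh

def scf(A_of_v, v0, maxit=200, tol=1e-10):
    """Plain SCF for A(v) v = lambda v: freeze A at the current iterate,
    take the eigenvector of interest of the frozen matrix, repeat.
    This basic fixed-point scheme carries no convergence guarantee."""
    v = v0 / np.linalg.norm(v0)
    for k in range(1, maxit + 1):
        w, V = eigh(A_of_v(v))          # linear eigenproblem with A frozen
        v_new = V[:, 0]                 # eigenvector of the smallest eigenvalue
        if v_new @ v < 0:
            v_new = -v_new              # fix the sign ambiguity
        if np.linalg.norm(v_new - v) <= tol:
            return w[0], v_new, k
        v = v_new
    return w[0], v, maxit

rng = np.random.default_rng(1)
n = 50
A0 = rng.standard_normal((n, n)); A0 = (A0 + A0.T) / 2
A_of_v = lambda v: A0 + 0.5 * np.diag(v**2)   # hypothetical toy nonlinearity
lam, v, its = scf(A_of_v, rng.standard_normal(n))
print(lam, its)
```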

Eigenvalue Computation from the Optimization Perspective: On Jacobi-Davidson, IIGD, RQI and Newton Updates

We discuss the close connection between eigenvalue computation and optimization using the Newton method and subspace methods. From the connection we derive a new class of Newton updates. The new update formulation is similar to the well-known Jacobi-Davidson method. This similarity leads to simplified versions of the Jacobi-Davidson method and the inverse iteration generalized Davidson (IIGD) m...
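
To make the connection tangible: for a symmetric matrix, the Jacobi-Davidson correction equation is a projected Newton step for the eigenpair residual. Below is a hedged dense sketch of that textbook correction equation and the bare Newton iteration built on it, without the subspace acceleration full Jacobi-Davidson adds; it is not the paper's new update formula.

```python
import numpy as np

def jd_correction(A, u, theta):
    """Solve the Jacobi-Davidson correction equation for symmetric A:
        (I - u u^T)(A - theta*I)(I - u u^T) t = -r,   t orthogonal to u,
    where r = A u - theta*u. The projected matrix is singular on span{u};
    lstsq's minimum-norm solution automatically lies in the complement."""
    n = A.shape[0]
    r = A @ u - theta * u
    P = np.eye(n) - np.outer(u, u)            # projector onto u-perp
    M = P @ (A - theta * np.eye(n)) @ P
    t = np.linalg.lstsq(M, -r, rcond=None)[0]
    return t - (u @ t) * u                    # re-enforce orthogonality

def newton_eigpair(A, u0, maxit=50, tol=1e-12):
    """Newton-type iteration on the correction step (no subspace)."""
    u = u0 / np.linalg.norm(u0)
    theta = u @ A @ u
    for _ in range(maxit):
        if np.linalg.norm(A @ u - theta * u) <= tol:
            break
        u = u + jd_correction(A, u, theta)
        u /= np.linalg.norm(u)
        theta = u @ A @ u                     # Rayleigh quotient update
    return theta, u

rng = np.random.default_rng(3)
n = 80
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
theta, u = newton_eigpair(A, rng.standard_normal(n))
print(theta, np.linalg.norm(A @ u - theta * u))
```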

Rayleigh Quotient Iteration and Simplified Jacobi-Davidson with Preconditioned Iterative Solves for Generalised Eigenvalue Problems

The computation of a right eigenvector and corresponding finite eigenvalue of a large sparse generalised eigenproblem Ax = λMx using preconditioned Rayleigh quotient iteration and the simplified Jacobi-Davidson method is considered. Both methods are inner-outer iterative methods, and we consider GMRES and FOM as iterative algorithms for the (inexact) solution of the inner systems that arise. The ...
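
A minimal illustration of that inner-outer structure: inexact Rayleigh quotient iteration for Ax = λMx, where the shifted inner system is solved only approximately by GMRES. This is a generic sketch (dense symmetric matrices, no preconditioner, SciPy ≥ 1.12 for the rtol keyword), not the authors' preconditioned setup.

```python
import numpy as np
from scipy.sparse.linalg import gmres

def inexact_rqi(A, M, x0, maxit=30, inner_tol=1e-2, tol=1e-10):
    """Inexact RQI for A x = lambda M x: the outer iteration updates the
    generalized Rayleigh quotient rho; the inner iteration solves
    (A - rho*M) y = M x only to the loose tolerance inner_tol."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(maxit):
        rho = (x @ A @ x) / (x @ M @ x)       # generalized Rayleigh quotient
        r = A @ x - rho * (M @ x)
        if np.linalg.norm(r) <= tol * np.linalg.norm(A @ x):
            break
        y, _ = gmres(A - rho * M, M @ x, rtol=inner_tol)  # inexact inner solve
        x = y / np.linalg.norm(y)
    return (x @ A @ x) / (x @ M @ x), x

rng = np.random.default_rng(4)
n = 200
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
C = rng.standard_normal((n, n)); M = C @ C.T + n * np.eye(n)
lam, x = inexact_rqi(A, M, rng.standard_normal(n))
print(lam, np.linalg.norm(A @ x - lam * (M @ x)))
```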

A Jacobi-Davidson Method for Solving Complex Symmetric Eigenvalue Problems

We discuss variants of the Jacobi–Davidson method for solving the generalized complex-symmetric eigenvalue problem. The Jacobi–Davidson algorithm can be considered as an accelerated inexact Rayleigh quotient iteration. We show that it is appropriate to replace the Euclidean inner product x*y in Cⁿ by the bilinear form xᵀy. The Rayleigh quotient based on this bilinear form leads to an asymptotical...
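
To spell out that replacement: for a complex-symmetric A (A = Aᵀ but not Hermitian), the Rayleigh quotient is formed with the non-conjugating bilinear form xᵀy. A small NumPy sketch, noting that @ on complex vectors does not conjugate, which is exactly what is wanted here:

```python
import numpy as np

def cs_rayleigh_quotient(A, x):
    """Rayleigh quotient x^T A x / x^T x for complex-symmetric A, built on
    the bilinear form x^T y rather than the Euclidean inner product x^* y.
    Defined only when x^T x != 0 (x must not be quasi-null)."""
    return (x @ (A @ x)) / (x @ x)

rng = np.random.default_rng(2)
n = 6
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (A + A.T) / 2                             # complex symmetric: A == A.T
w, V = np.linalg.eig(A)
x = V[:, 0] + 1e-4 * rng.standard_normal(n)   # perturbed eigenvector
print(cs_rayleigh_quotient(A, x), w[0])       # close to the true eigenvalue
```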


Publication date: 2013